ECE 6605 - Information Theory

Prof. Matthieu Bloch

Tuesday, December 5, 2023

Announcements

  • Project
    • Send me an email to confirm your choices!
    • I'll hold project office hours to help you
    • This is meant to be fun
  • Exam options
    • Oral: list of problems published, scheduling options coming
    • Written: Thursday, December 14 11:20 am - 2:10 pm
  • Last time
    • Secret key generation
    • Covert communications
  • Today
    • Information theory beyond this class

Other topics in information theory

  • Multi-user information theory
    • Abbas El~Gamal and Young-Han Kim, Network Information Theory, Cambridge University Press
  • Quantum information theory
    • Mark M. Wilde, Quantum Information Theory, Cambridge University Press
  • Information theory and applied probability
    • Maxim Raginsky and Igal Sason, Concentration of Measure Inequalities in Information Theory, Communications, and Coding, Foundations and Trends in Communications and Information Theory
  • Information theory and machine learning
    • Hellström et al., Generalization Bounds: Perspectives from Information Theory and PAC-Bayes, arXiv
  • Information theory and biology
    • Nakano et al., Molecular Communications, Cambridge University Press

Lossy source coding with side information

[Figure: lossy source coding with side information setup]
  • What if we only have limited rate for communication?
    • Lossy source coding: we can't reconstruct exactly
    • Need to accept some distortion: \(d:\calX\times\calU\to[0,d_{\max}]\), \(d(X^n,U^n)=\frac{1}{n}\sum_{i=1}^nd(X_i,U_i)\)
    • Require \(\lim_{n\to\infty}\E{d(X^n,U^n)}\leq D\)
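The per-letter average distortion \(d(X^n,U^n)=\frac{1}{n}\sum_{i=1}^nd(X_i,U_i)\) can be sketched as follows; the Hamming distortion and the example sequences are illustrative choices, not from the lecture:

```python
def hamming_distortion(x: int, u: int) -> float:
    """Single-letter Hamming distortion: 0 if symbols agree, 1 otherwise."""
    return 0.0 if x == u else 1.0

def average_distortion(xs, us, d=hamming_distortion) -> float:
    """Average per-letter distortion d(x^n, u^n) = (1/n) * sum_i d(x_i, u_i)."""
    assert len(xs) == len(us)
    n = len(xs)
    return sum(d(x, u) for x, u in zip(xs, us)) / n

# Two mismatches out of five symbols -> distortion 2/5.
print(average_distortion([0, 1, 1, 0, 1], [0, 1, 0, 0, 0]))  # 0.4
```

The fidelity criterion then asks that the expectation of this quantity stay below \(D\) as \(n\to\infty\).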

The rate–distortion function for \(X\) with side information \(Y\) available at the decoder is \[ R(D) = \min\left(\mathbb{I}(X;U)-\mathbb{I}(Y;U)\right) \] where the minimum is over all \(p_{U|X}\) such that \(U-X-Y\) forms a Markov chain and over all reconstruction functions \(\widehat{x}(u,y)\) satisfying \(\E{d(X,\widehat{x}(U,Y))}\leq D\). The smallest meaningful distortion is the one achievable from the side information alone, \(D_{\min} = \min_{\widehat{x}(\cdot)}\E{d(X,\widehat{x}(Y))}\).
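As a numerical illustration (not from the lecture), the objective \(\mathbb{I}(X;U)-\mathbb{I}(Y;U)\) can be evaluated for one particular binary test channel: \(X\) uniform, \(Y\) a BSC(\(p\)) observation of \(X\), and \(U\) the output of a BSC(\(q\)) test channel, which respects the Markov chain \(U-X-Y\). The crossover probabilities \(p\) and \(q\) below are arbitrary choices; the true \(R(D)\) would require optimizing over all admissible \(p_{U|X}\):

```python
import itertools
import math

def mutual_information(pab):
    """I(A;B) in bits for a joint pmf given as a dict {(a, b): prob}."""
    pa, pb = {}, {}
    for (a, b), pr in pab.items():
        pa[a] = pa.get(a, 0.0) + pr
        pb[b] = pb.get(b, 0.0) + pr
    return sum(pr * math.log2(pr / (pa[a] * pb[b]))
               for (a, b), pr in pab.items() if pr > 0)

p, q = 0.1, 0.05  # hypothetical BSC crossover probabilities (illustrative)

# Joint pmf of (X, Y, U): X ~ Bern(1/2), Y = X xor N1 with N1 ~ Bern(p),
# U = X xor N2 with N2 ~ Bern(q), N1 and N2 independent, so U - X - Y holds.
pxyu = {}
for x, n1, n2 in itertools.product([0, 1], repeat=3):
    y, u = x ^ n1, x ^ n2
    pr = 0.5 * (p if n1 else 1 - p) * (q if n2 else 1 - q)
    pxyu[(x, y, u)] = pxyu.get((x, y, u), 0.0) + pr

# Marginalize to the pairs (X, U) and (Y, U).
pxu, pyu = {}, {}
for (x, y, u), pr in pxyu.items():
    pxu[(x, u)] = pxu.get((x, u), 0.0) + pr
    pyu[(y, u)] = pyu.get((y, u), 0.0) + pr

rate = mutual_information(pxu) - mutual_information(pyu)
print(rate)  # I(X;U) - I(Y;U) for this particular p_{U|X}
```

Since \(U\) is a degraded-then-less-noisy observation of \(X\) compared to \(Y\) in this construction, the difference is positive, consistent with a nonzero rate being needed at the encoder.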